Bennett-type Generalization Bounds: Large-deviation Case and Faster Rate of Convergence

Author

  • Chao Zhang
Abstract

In this paper, we present Bennett-type generalization bounds of the learning process for i.i.d. samples, and then show that these generalization bounds have a faster rate of convergence than the traditional results. In particular, we first develop two types of Bennett-type deviation inequalities for the i.i.d. learning process: one provides generalization bounds based on the uniform entropy number; the other leads to bounds based on the Rademacher complexity. We then adopt a new method to obtain alternative expressions of the Bennett-type generalization bounds, which imply that the bounds have a faster rate o(N^{-1/2}) of convergence than the traditional results O(N^{-1/2}). Additionally, we find that the rate of the bounds becomes faster in the large-deviation case, which refers to a situation where the empirical risk is far away from (at least not close to) the expected risk. Finally, we analyze the asymptotic convergence of the learning process and compare our analysis with the existing results.
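The abstract contrasts Bennett-type bounds with the traditional Hoeffding-type rate. The following is a minimal sketch (not taken from the paper) comparing the classical Bennett and Hoeffding tail bounds for bounded i.i.d. variables; it illustrates numerically why a variance-sensitive (Bennett-type) bound tightens sharply in the large-deviation, small-variance regime. The parameter choices are illustrative assumptions.

```python
import math

def bennett_h(u):
    # Bennett's function h(u) = (1 + u) ln(1 + u) - u
    return (1 + u) * math.log(1 + u) - u

def bennett_bound(n, eps, var, b=1.0):
    # Bennett's inequality for i.i.d. X_i with |X_i - E X_i| <= b and Var(X_i) = var:
    # P(mean - mu >= eps) <= exp(-(n * var / b^2) * h(b * eps / var))
    return math.exp(-(n * var / b**2) * bennett_h(b * eps / var))

def hoeffding_bound(n, eps, b=1.0):
    # Hoeffding's inequality for X_i in an interval of length b:
    # P(mean - mu >= eps) <= exp(-2 * n * eps^2 / b^2)
    return math.exp(-2 * n * eps**2 / b**2)

# Illustrative regime: small variance, deviation eps large relative to var.
n, eps, var = 1000, 0.1, 0.01
print(bennett_bound(n, eps, var))   # variance-aware bound: far smaller
print(hoeffding_bound(n, eps))      # variance-free bound
```

With these (assumed) values, the Bennett bound's exponent scales like n·var·h(eps/var) rather than n·eps², so it decays much faster once eps exceeds the variance — a rough analogue of the "large-deviation case" the paper studies.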


Similar References

Generalization Bounds for Representative Domain Adaptation

In this paper, we propose a novel framework to analyze the theoretical properties of the learning process for a representative type of domain adaptation, which combines data from multiple sources and one target (or briefly called representative domain adaptation). In particular, we use the integral probability metric to measure the difference between the distributions of two d...


Weak Gibbs Measures: Speed of Convergence to Entropy, Topological and Geometrical Aspects

Abstract. In this paper we obtain exponential large deviation bounds in the Shannon-McMillan-Breiman convergence formula for entropy in the case of weak Gibbs measures and topologically mixing subshifts of finite type. We also prove almost sure estimates for the error term in the convergence to entropy given by Shannon-McMillan-Breiman formula for both uniformly and non-uniformly expanding shif...


On the approximation by Chlodowsky type generalization of (p,q)-Bernstein operators

In the present article, we introduce Chlodowsky variant of $(p,q)$-Bernstein operators and compute the moments for these operators which are used in proving our main results. Further, we study some approximation properties of these new operators, which include the rate of convergence using usual modulus of continuity and also the rate of convergence when the function $f$ belongs to the class Li...


Generalization Bounds for the Area Under an ROC Curve

We study generalization properties of the area under an ROC curve (AUC), a quantity that has been advocated as an evaluation criterion for bipartite ranking problems. The AUC is a different and more complex term than the error rate used for evaluation in classification problems; consequently, existing generalization bounds for the classification error rate cannot be used to draw conclusions abo...


Generalization Bounds for the Area Under the ROC Curve

We study generalization properties of the area under the ROC curve (AUC), a quantity that has been advocated as an evaluation criterion for the bipartite ranking problem. The AUC is a different term than the error rate used for evaluation in classification problems; consequently, existing generalization bounds for the classification error rate cannot be used to draw conclusions about the AUC. I...



Journal:
  • CoRR

Volume: abs/1309.6876  Issue: -

Pages: -

Published: 2013